Unsupervised Kernel Dimension Reduction: Supplemental Material

Authors

  • Meihong Wang
  • Fei Sha
  • Michael I. Jordan
Abstract

We start by noting that the conditional independence X ⊥ Y | B⊤X does not necessarily imply that the correlation between B⊤X and Y is maximized. To see this, let X be a Gaussian random variable with zero mean and diagonal covariance matrix. Assume B is an identity matrix and Y = X² = (B⊤X)² (elementwise square for a vectorial X). The conditional independence is obviously satisfied, yet the correlation between B⊤X and Y is zero, since cov(X, X²) = E[X³] = 0 for zero-mean Gaussian X. This observation is yet another example of the limitation of Pearson's correlation measure, which detects only linear dependence between random variables. In the following, we show that when dependence is measured in the RKHS, the two measures Ĵ_{YY|X} and Ĵ_{XY} are equivalent. Assume we use a Gaussian RBF kernel for both Ĵ_{YY|X}(B⊤X, Y) and Ĵ_{XY}(B⊤X, Y):

K(x_i, x_j) = exp(−‖x_i − x_j‖² / σ_N²)
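As a quick numerical check of the argument above, the following is a minimal sketch that draws a zero-mean Gaussian X, sets Y = X², and compares the Pearson correlation (which vanishes) with a kernel dependence measure computed in the RKHS. As a stand-in for the Ĵ measures we use a biased empirical HSIC estimate with Gaussian RBF kernels; HSIC itself, the median heuristic for the bandwidth, and the helper rbf_gram are our assumptions for illustration, not the exact estimator used in the paper.

import numpy as np

rng = np.random.default_rng(0)

# Counterexample from the text: zero-mean Gaussian X, B = I, Y = X^2.
# X is independent of Y given B^T X trivially, yet the Pearson
# correlation between B^T X and Y vanishes: the dependence is nonlinear.
n = 1000
x = rng.standard_normal(n)
y = x ** 2

print("Pearson correlation:", np.corrcoef(x, y)[0, 1])  # close to 0

def rbf_gram(v, sigma2=None):
    """Gaussian RBF Gram matrix K_ij = exp(-(v_i - v_j)^2 / sigma2).

    sigma2 defaults to the median heuristic -- an assumption made
    here for illustration, not prescribed by the text."""
    d2 = (v[:, None] - v[None, :]) ** 2
    if sigma2 is None:
        sigma2 = np.median(d2[d2 > 0])
    return np.exp(-d2 / sigma2)

# Biased empirical HSIC, tr(K H L H) / (n - 1)^2, with H the centering
# matrix; used here as a stand-in RKHS dependence measure.
K = rbf_gram(x)
L = rbf_gram(y)
H = np.eye(n) - np.ones((n, n)) / n
hsic = np.trace(K @ H @ L @ H) / (n - 1) ** 2
print("Empirical HSIC:", hsic)  # clearly positive: the RKHS measure
                                # detects the quadratic dependence

On a typical draw the Pearson correlation is within sampling noise of zero while the HSIC value stays bounded away from zero, matching the claim that only the RKHS-based measure detects this dependence.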

Similar Resources

Unsupervised Kernel Dimension Reduction

We apply the framework of kernel dimension reduction, originally designed for supervised problems, to unsupervised dimensionality reduction. In this framework, kernel-based measures of independence are used to derive low-dimensional representations that maximally capture information in covariates in order to predict responses. We extend this idea and develop similarly motivated measures for uns...


Unsupervised Multiple Kernel Learning

Traditional multiple kernel learning (MKL) algorithms are essentially supervised, in the sense that the kernel learning task requires the class labels of the training data. However, class labels may not always be available prior to the kernel learning task in some real-world scenarios, e.g., an early preprocessing step of a classification task or an unsupervised learning task such as dimens...


Evolutionary Unsupervised Kernel Regression

Dimension reduction and manifold learning play an important role in robotics, multimedia processing, and data mining. For these tasks, strong methods like Unsupervised Kernel Regression [4, 7] or Gaussian Process Latent Variable Models [5, 6] have been proposed in recent years. But many methods suffer from numerous local optima and crucial parameter dependencies. We use advanced methods from st...


Regression on manifolds using kernel dimension reduction

We study the problem of discovering a manifold that best preserves information relevant to a nonlinear regression. Solving this problem involves extending and uniting two threads of research. On the one hand, the literature on sufficient dimension reduction has focused on methods for finding the best linear subspace for nonlinear regression; we extend this to manifolds. On the other hand, the l...


Unsupervised Nonlinear Feature Extraction Method and Its Effects on Target Detection in High-dimensional Data

Principal component analysis (PCA) is one of the most effective unsupervised techniques for feature extraction. To extract higher-order properties of data, researchers extended PCA to kernel PCA (KPCA) by means of kernel machines. In this paper, KPCA is applied as a feature-extraction and dimension-reduction preprocessing step for target detection in hyperspectral images. Then the ...



Publication date: 2010